Bayesian adaptation of hidden layers in Boolean feedforward neural networks

Authors

  • Wolfgang Utschick
  • Josef A. Nossek
Abstract

In this paper a statistical point of view of feedforward neural networks is presented. The hidden layer of a multilayer perceptron neural network is identified as representing the mapping of random vectors. Utilizing hard-limiter activation functions, the second and all further layers of the multilayer perceptron, including the output layer, represent the mapping of a Boolean function. Boolean neural networks are naturally appropriate for the categorization of input data. Training is carried out exclusively on the first layer of the neural network, whereas the definition of the Boolean function generally remains a matter of experience or of symmetry considerations. In this work a method is introduced for adapting the Boolean function of the network, utilizing statistical knowledge of the internal representation of the input data. Applied to the classification problem of grey-level bitmaps of handwritten characters, the misclassification rate of the neural network is reduced by approximately 20%.
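The idea in the abstract can be sketched in a few lines: a fixed first layer with hard-limiter activations maps each input to a binary internal representation, and the Boolean function of the subsequent layers is adapted statistically by assigning each observed binary pattern the class that most often produced it (a maximum-a-posteriori table lookup). This is a minimal illustrative sketch, not the paper's implementation; the toy data, weights, and fallback rule are assumptions.

```python
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(0)

# Toy two-class data (illustrative assumption): clusters around (-1,-1) and (+1,+1).
X = np.vstack([rng.normal(-1, 0.5, (100, 2)), rng.normal(1, 0.5, (100, 2))])
y = np.array([0] * 100 + [1] * 100)

# Fixed (assumed pretrained) first-layer weights and biases.
W = rng.normal(size=(2, 4))
b = rng.normal(size=4)

def hidden(x):
    """Hard-limiter hidden layer: maps an input to a binary tuple."""
    return tuple((x @ W + b > 0).astype(int))

# Collect class-conditional counts of each internal binary representation.
counts = defaultdict(lambda: np.zeros(2))
for xi, yi in zip(X, y):
    counts[hidden(xi)][yi] += 1

# Adapt the Boolean output function: MAP class per observed hidden pattern.
boolean_fn = {h: int(np.argmax(c)) for h, c in counts.items()}

def classify(x):
    # Patterns never seen in training fall back to class 0 in this sketch.
    return boolean_fn.get(hidden(x), 0)

acc = np.mean([classify(xi) == yi for xi, yi in zip(X, y)])
print(f"training accuracy: {acc:.2f}")
```

Because every observed pattern is assigned its majority class, this table lookup can only match or improve on any other fixed Boolean assignment over the training set, which is the sense in which statistical adaptation of the Boolean function reduces the error rate.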


Similar resources

Deep Belief Networks Are Compact Universal Approximators

Deep Belief Networks (DBN) are generative models with many layers of hidden causal variables, recently introduced by Hinton et al. (2006), along with a greedy layer-wise unsupervised learning algorithm. Building on Le Roux and Bengio (2008) and Sutskever and Hinton (2008), we show that deep but narrow generative networks do not require more parameters than shallow ones to achieve universal appr...


A new strategy for adaptively constructing multilayer feedforward neural networks

In this paper a new strategy for adaptively and autonomously constructing a multi-hidden-layer feedforward neural network (FNN) is introduced. The proposed scheme belongs to a class of structure level adaptation algorithms that adds both new hidden units and new hidden layers one at a time when it is determined to be needed. Using this strategy, a FNN may be constructed having as many hidden la...


Role of Function Complexity and Network Size in the Generalization Ability of Feedforward Networks

The generalization ability of architectures of different sizes, with one and two hidden layers, trained with backpropagation combined with early stopping, has been analyzed. The dependence of the generalization process on the complexity of the function being implemented is studied using a recently introduced measure for the complexity of Boolean functions. For a whole set of Boolean symmetric functi...


Revising Bayesian Network Parameters Using Backpropagation

The problem of learning Bayesian networks with hidden variables is known to be a hard problem. Even the simpler task of learning just the conditional probabilities on a Bayesian network with hidden variables is hard. In this paper, we present an approach that learns the conditional probabilities on a Bayesian network with hidden variables by transforming it into a multi-layer feedforward neural...


Function Approximation by Random Neural Networks with a Bounded Number of Layers

This paper discusses the function approximation properties of the Gelenbe random neural network (GNN). We use two extensions of the basic model: the bipolar GNN (BGNN) and the clamped GNN (CGNN). We limit the networks to being feedforward and consider the case where the number of hidden layers does not exceed the number of input layers. With these constraints we show that the feedforward CGNN and the BGN...



Journal:

Volume   Issue

Pages  -

Publication date: 1996